Multi-task Active Learning for Pre-trained Transformer-based Models

Authors

Abstract

Multi-task learning, in which several tasks are jointly learned by a single model, allows NLP models to share information from multiple annotations and may facilitate better predictions when the tasks are inter-related. This technique, however, requires annotating the same text with multiple annotation schemes, which may be costly and laborious. Active learning (AL) has been demonstrated to optimize annotation processes by iteratively selecting unlabeled examples whose annotation is most valuable for the learning model. Yet, multi-task active learning (MT-AL) has not been applied to state-of-the-art pre-trained Transformer-based NLP models. This paper aims to close this gap. We explore various multi-task selection criteria in three realistic multi-task scenarios, reflecting different relations between the participating tasks, and demonstrate the effectiveness of multi-task compared to single-task selection. Our results suggest that MT-AL can be effectively used in order to minimize the annotation efforts of multi-task NLP models.
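The AL loop sketched in the abstract scores each unlabeled example by model uncertainty and sends the top-scoring ones for annotation. A minimal illustration of one possible multi-task selection criterion, averaging per-task prediction entropy across tasks, is below; the function names and toy probabilities are illustrative and not taken from the paper itself:

```python
import math

def entropy(probs):
    """Shannon entropy of one predicted class distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_batch(task_probs, k):
    """Pick the k unlabeled examples whose average per-task
    prediction entropy is highest.

    task_probs: {task_name: [class-probability list per example]}
    Returns the indices of the selected examples.
    """
    n = len(next(iter(task_probs.values())))
    # Average the uncertainty over all participating tasks.
    scores = [
        sum(entropy(probs[i]) for probs in task_probs.values()) / len(task_probs)
        for i in range(n)
    ]
    ranked = sorted(range(n), key=lambda i: scores[i], reverse=True)
    return ranked[:k]

# Toy pool of 4 unlabeled examples and two tasks' binary confidences.
pool = {
    "task_a": [[0.9, 0.1], [0.5, 0.5], [0.8, 0.2], [0.6, 0.4]],
    "task_b": [[0.95, 0.05], [0.6, 0.4], [0.5, 0.5], [0.7, 0.3]],
}
print(select_batch(pool, 2))  # examples 1 and 3 are least confidently predicted
```

In a real MT-AL loop this selection step would run once per iteration, with the probabilities coming from the jointly trained Transformer's task heads over the remaining unlabeled pool.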


Related resources

Active Task Selection for Multi-Task Learning

In multi-task learning, a learner is given a collection of prediction tasks and needs to solve all of them. In contrast to previous work, which required that annotated training data is available for all tasks, we consider a new setting, in which for some tasks, potentially most of them, only unlabeled training data is provided. Consequently, to solve all tasks, information must be transferred b...


Active Learning for Multi-Task Adaptive Filtering

In this paper, we propose an Active Learning (AL) framework for the Multi-Task Adaptive Filtering (MTAF) problem. Specifically, we explore AL approaches to rapidly improve an MTAF system, based on Dirichlet Process priors, with minimal user/task-level feedback. The proposed AL approaches select instances for delivery with a two-fold objective: 1) Improve future task-specific system performance ...


Multi-Task Active Learning for Linguistic Annotations

We extend the classical single-task active learning (AL) approach. In the multi-task active learning (MTAL) paradigm, we select examples for several annotation tasks rather than for a single one as usually done in the context of AL. We introduce two MTAL metaprotocols, alternating selection and rank combination, and propose a method to implement them in practice. We experiment with a two-task an...


Unregistered Multiview Mammogram Analysis with Pre-trained Deep Learning Models

We show two important findings on the use of deep convolutional neural networks (CNN) in medical image analysis. First, we show that CNN models that are pre-trained using computer vision databases (e.g., Imagenet) are useful in medical image applications, despite the significant differences in image appearance. Second, we show that multiview classification is possible without the pre-registrati...



Journal

Journal title: Transactions of the Association for Computational Linguistics

Year: 2022

ISSN: 2307-387X

DOI: https://doi.org/10.1162/tacl_a_00515